- Title
- Language chunking, data sparseness, and the value of a long marker list: explorations with word n-grams and authorial attribution
- Creator
- Antonia, Alexis; Craig, Hugh; Elliott, Jack
- Relation
- Literary and Linguistic Computing, Vol. 29, Issue 2, pp. 147-163
- Publisher Link
- http://dx.doi.org/10.1093/llc/fqt028
- Publisher
- Oxford University Press
- Resource Type
- journal article
- Date
- 2014
- Description
- The frequencies of individual words have been the mainstay of computer-assisted authorial attribution over the past three decades. The usefulness of this sort of data is attested in many benchmark trials and in numerous studies of particular authorship problems. It is sometimes argued, however, that since language as spoken or written falls into word sequences, on the 'idiom principle', and since language is characteristically produced in the brain in chunks, not in individual words, n-grams with n higher than 1 are superior to individual words as a source of authorship markers. In this article, we test the usefulness of word n-grams for authorship attribution by asking how many good-quality authorship markers are yielded by n-grams of various types, namely 1-grams, 2-grams, 3-grams, 4-grams, and 5-grams. We use two ways of formulating the n-grams, two corpora of texts, and two methods for finding and assessing markers. We find that when using methods based on regularly occurring markers, and drawing on all the available vocabulary, 1-grams perform best. With methods based on rare markers, and all the available vocabulary, strict 3-gram sequences perform best. If we restrict ourselves to a defined word-list of function-words to form n-grams, 2-grams offer a striking improvement on 1-grams.
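The abstract contrasts strict word n-grams drawn from the full vocabulary with n-grams formed over a restricted function-word list. As a minimal illustration of those two formulations (the article's actual function-word list and marker-assessment methods are not reproduced here; the word set below is a hypothetical stand-in), word n-grams can be sketched as:

```python
from collections import Counter

def word_ngrams(tokens, n):
    """Strict n-grams: contiguous sequences of n words."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Illustrative function-word set; the article's defined word-list is not given here.
FUNCTION_WORDS = {"the", "of", "and", "to", "in", "a", "that", "it", "is", "was"}

def function_word_ngrams(tokens, n):
    """N-grams over the function words only, skipping intervening content words."""
    fw_tokens = [t for t in tokens if t in FUNCTION_WORDS]
    return word_ngrams(fw_tokens, n)

tokens = "it was the best of times it was the worst of times".split()
print(Counter(word_ngrams(tokens, 2)).most_common(3))          # strict 2-grams
print(Counter(function_word_ngrams(tokens, 2)).most_common(3)) # function-word 2-grams
```

Counting n-grams this way makes the data-sparseness issue visible: as n grows, most n-grams occur only once, which is why the article distinguishes methods based on regularly occurring markers from methods based on rare ones.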
- Subject
- authorial attribution; n-grams; word frequency; authorial markers
- Identifier
- http://hdl.handle.net/1959.13/1302892
- Identifier
- uon:20570
- Identifier
- ISSN:0268-1145
- Language
- eng
- Reviewed